Antigoni Kaliontzopoulou, CIBIO/InBIO, University of Porto
12 October, 2019
Allometric growth follows the power law \(\small Y=\beta x^{\alpha}\), which becomes linear on a log scale: \(\small\log(Y)=\log(\beta)+\alpha\log(x)\)
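The exponent \(\alpha\) can therefore be estimated by ordinary least squares on log-transformed data. A minimal sketch in Python/NumPy (this deck otherwise uses R; all values below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a trait growing allometrically with body size:
# Y = beta * x^alpha, with multiplicative noise (hypothetical values).
alpha_true, beta_true = 1.3, 0.5          # alpha > 1: positive allometry
x = rng.uniform(1.0, 10.0, size=200)      # body size
y = beta_true * x**alpha_true * np.exp(rng.normal(0, 0.05, size=200))

# On the log scale the model is linear, log(y) = log(beta) + alpha*log(x),
# so alpha is simply the slope of an OLS fit to the logged data.
slope, intercept = np.polyfit(np.log(x), np.log(y), deg=1)
print(round(slope, 2))   # close to alpha_true = 1.3
```

A slope near 1 would indicate isometry; here the recovered slope is close to the simulated \(\alpha\).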
Note: the concepts of positive and negative allometry apply only to univariate traits! (For multivariate traits, allometry is a vector in a multi-dimensional space, where 'positive' and 'negative' directions are not defined and have no meaning.)
Also, it is mathematically impossible for all body parts to simultaneously grow faster (positive allometry) or slower (negative allometry) than total body size.
Here is isometry… and here is allometry
To characterize allometric patterns, we fit a model of shape~size as:
\[\small\mathbf{Y}=\mathbf{X}\mathbf{\beta } +\mathbf{E}\]
We evaluate the allometry model \(\small\mathbf{X}_{F}\) by comparing it to an intercept model \(\small\mathbf{X}_{R}\):
\(\tiny\mathbf{X}_R = \begin{bmatrix} 1\\ 1\\ 1\\ 1\\ 1\\ 1 \end{bmatrix}\) and \(\tiny\mathbf{X}_F = \begin{bmatrix} 1 & 0.3 \\ 1 & 0.5 \\ 1 & 0.2 \\ 1 & 1.2 \\ 1 & 0.7 \\ 1 & 1.1 \end{bmatrix}\)
| Estimate | \(\small\mathbf{X}_{R}\) | \(\small\mathbf{X}_{F}\) |
|---|---|---|
| Coefficients | \(\tiny\hat{\mathbf{\beta_R}}=\left ( \mathbf{X}_R^{T} \mathbf{X}_R\right )^{-1}\left ( \mathbf{X}_R^{T} \mathbf{Y}\right )\) | \(\tiny\hat{\mathbf{\beta_F}}=\left ( \mathbf{X}_F^{T} \mathbf{X}_F\right )^{-1}\left ( \mathbf{X}_F^{T} \mathbf{Y}\right )\) |
| Predicted Values | \(\small\hat{\mathbf{Y}}_R=\mathbf{X}_R\hat{\mathbf{\beta}}_R\) | \(\small\hat{\mathbf{Y}}_F=\mathbf{X}_F\hat{\mathbf{\beta}}_F\) |
| Model Residuals | \(\small\hat{\mathbf{E}}_R=\mathbf{Y}-\hat{\mathbf{Y}}_R\) | \(\small\hat{\mathbf{E}}_F=\mathbf{Y}-\hat{\mathbf{Y}}_F\) |
| Model Residual Error (\(\small{SSE}\)) | \(\small\mathbf{S}_R=\hat{\mathbf{E}}_R^T\hat{\mathbf{E}}_R\) | \(\small\mathbf{S}_F=\hat{\mathbf{E}}_F^T\hat{\mathbf{E}}_F\) |
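The estimates in the table can be computed directly with linear algebra. A sketch in Python/NumPy (this deck otherwise uses R), using the six size values from the example \(\small\mathbf{X}_{F}\) above and a hypothetical shape matrix:

```python
import numpy as np

# The six size values from the example design matrix X_F on this slide.
size = np.array([0.3, 0.5, 0.2, 1.2, 0.7, 1.1])
n = size.size

# Reduced (intercept-only) and full (intercept + size) design matrices.
XR = np.ones((n, 1))
XF = np.column_stack([np.ones(n), size])

# A hypothetical shape matrix Y (n specimens x 2 shape variables).
rng = np.random.default_rng(1)
Y = 0.2 * size[:, None] + rng.normal(0, 0.01, size=(n, 2))

def ols(X, Y):
    """OLS estimates as in the table: coefficients, predicted values,
    residuals, and the SSCP matrix of residual error."""
    B = np.linalg.solve(X.T @ X, X.T @ Y)   # (X'X)^-1 (X'Y)
    Yhat = X @ B
    E = Y - Yhat
    return B, Yhat, E, E.T @ E

BR, YhatR, ER, SR = ols(XR, Y)
BF, YhatF, EF, SF = ols(XF, Y)

# The full model always absorbs at least as much variation as the
# intercept model, so its residual error cannot be larger.
print(np.trace(SF) <= np.trace(SR))   # True
```

Note that the intercept-only model simply predicts the mean shape for every specimen.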
1: Fit \(\small\mathbf{X}_{R}\) for each \(\small\mathbf{X}_{F}\); estimate \(\small\hat{\mathbf{Y}}_{R}\) and \(\small\mathbf{E}_{R}\)
2: Permute \(\small\mathbf{E}_{R}\) and obtain pseudo-values as: \(\small\mathbf{\mathcal{Y}} = \mathbf{\hat{Y}}_{R} + \mathbf{E}_{R}\)
3: Fit \(\small\mathbf{X}_{F}\) using \(\small\mathbf{\mathcal{Y}}\): obtain coefficients and summary statistics
4: Calculate \(\small{F}\)-value in every random permutation (observed case counts as one permutation)
5: For \(\small{N}\) permutations, \(\small{P} = \frac{N(F_{random} \geq F_{obs})}{N}\)
6: Calculate effect size as a standard deviate of the observed value in a normalized distribution of random values (helps for comparing effects within and between models); i.e.: \[\small{z} = \frac{ \log\left( F\right) - \mu_{\log\left(F\right)} } { \sigma_{\log\left(F\right)} }\] where \(\small\mu_{\log\left(F\right)}\) and \(\sigma_{\log\left(F\right)}\) are the expected value and standard deviation from the sampling distribution, respectively.
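The six steps above can be sketched end to end. A minimal residual-randomization permutation procedure in Python/NumPy (this deck otherwise uses R/RRPP; data, sample sizes, and effect sizes here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: n specimens, p shape variables, one size covariate.
n, p = 40, 6
size = rng.uniform(1, 5, n)
Y = 0.2 * size[:, None] + rng.normal(0, 0.2, (n, p))   # shape ~ size + noise

XR = np.ones((n, 1))                      # reduced: intercept only
XF = np.column_stack([np.ones(n), size])  # full: intercept + size

def sse(X, Y):
    B = np.linalg.solve(X.T @ X, X.T @ Y)
    E = Y - X @ B
    return np.trace(E.T @ E)              # residual SS summed over variables

def f_stat(Y):
    ssr, ssf = sse(XR, Y), sse(XF, Y)
    df_num = XF.shape[1] - XR.shape[1]
    df_den = n - XF.shape[1]
    return ((ssr - ssf) / df_num) / (ssf / df_den)

# Step 1: fit the reduced model; keep fitted values and residuals.
BR = np.linalg.solve(XR.T @ XR, XR.T @ Y)
Yhat_R = XR @ BR
E_R = Y - Yhat_R

# Steps 2-4: permute reduced-model residuals, form pseudo-values,
# refit the full model, record F; the observed case is one permutation.
F_obs = f_stat(Y)
F_rand = [F_obs]
for _ in range(999):
    pseudo_Y = Yhat_R + E_R[rng.permutation(n)]
    F_rand.append(f_stat(pseudo_Y))
F_rand = np.array(F_rand)

# Step 5: empirical P-value; step 6: effect size as a standard deviate.
P = np.mean(F_rand >= F_obs)
z = (np.log(F_obs) - np.log(F_rand).mean()) / np.log(F_rand).std()
print(P < 0.05, z > 1)   # a strong simulated allometry signal
```

This is what `procD.lm` automates (with 1000 permutations by default, counting the observed case) in the examples that follow.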
Does body shape covary with size in Pecos pupfish?
This is a hypothesis of Allometry (shape~size covariation)
data(Pupfish)
Pupfish$logSize <- log(Pupfish$CS)
fit <- procD.lm(coords ~ logSize, SS.type = "I",
data = Pupfish, print.progress = FALSE, iter = 999)
anova(fit)
##
## Analysis of Variance, using Residual Randomization
## Permutation procedure: Randomization of null model residuals
## Number of permutations: 1000
## Estimation method: Ordinary Least Squares
## Sums of Squares and Cross-products: Type I
## Effect sizes (Z) based on F distributions
##
## Df SS MS Rsq F Z Pr(>F)
## logSize 1 0.014019 0.0140193 0.24886 17.229 4.4621 0.001 **
## Residuals 52 0.042314 0.0008137 0.75114
## Total 53 0.056333
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Call: procD.lm(f1 = coords ~ logSize, iter = 999, SS.type = "I", data = Pupfish,
## print.progress = FALSE)
Several (complementary) solutions:
##
## Analysis of Variance, using Residual Randomization
## Permutation procedure: Randomization of null model residuals
## Number of permutations: 1000
## Estimation method: Ordinary Least Squares
## Sums of Squares and Cross-products: Type I
## Effect sizes (Z) based on F distributions
##
## Df SS MS Rsq F Z Pr(>F)
## CS 1 0.05814 0.058139 0.09503 40.0581 7.7415 0.001 **
## type 1 0.03235 0.032350 0.05287 22.2892 6.6662 0.001 **
## CS:type 1 0.00610 0.006105 0.00998 4.2061 3.3590 0.001 **
## Residuals 355 0.51524 0.001451 0.84212
## Total 358 0.61183
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Call: procD.lm(f1 = Y.gpa$coords ~ CS * type, data = gdf, print.progress = FALSE)
\[\small\alpha=(\mathbf{Y}_{cent}^T\mathbf{CS})/\mathbf{(CS^TCS)}\] \[\small{CAC}=\mathbf{Y}_{cent}\bar{\alpha}\] where \[\small\bar{\alpha}=\alpha/\sqrt{\alpha^T\alpha}\] \[\small\mathbf{s}=\mathbf{Y}\mathbf{\beta}^T(\mathbf{\beta}\mathbf{\beta}^T)^{-1/2}\]
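Both scores project each specimen's shape onto a unit-length allometry vector, and for a single-group OLS fit they describe essentially the same axis. A sketch in Python/NumPy (this deck otherwise uses R; the data are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical shape data Y (n specimens x p variables) and centroid sizes CS.
n, p = 30, 4
CS = rng.uniform(1, 3, n)
Y = 0.3 * CS[:, None] + rng.normal(0, 0.1, (n, p))
Ycent = Y - Y.mean(axis=0)                # mean-centered shape

# Common allometric component (CAC): project mean-centered shape onto the
# normalized vector of shape/size covariation.
alpha = (Ycent.T @ CS) / (CS @ CS)        # slide: (Ycent' CS) / (CS' CS)
alpha = alpha / np.sqrt(alpha @ alpha)    # normalize to unit length
CAC = Ycent @ alpha

# Regression score: project shape onto the unit-length OLS slope vector.
X = np.column_stack([np.ones(n), CS])
beta = np.linalg.solve(X.T @ X, X.T @ Y)[1]   # slope row of coefficients
s = Y @ beta / np.sqrt(beta @ beta)           # slide: Y b' (b b')^{-1/2}

# With one group and one covariate, alpha and beta are proportional,
# so the two scores are perfectly correlated (up to a sign).
r = np.corrcoef(CAC, s)[0, 1]
print(abs(r) > 0.99)   # True
```

In geomorph these correspond to `plotAllometry` with `method = "CAC"` and `method = "RegScore"`, respectively.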
data(pupfish)
plotAllSpecimens(pupfish$coords)
pupfish$logSize <- log(pupfish$CS)
pupfish$Group <- interaction(pupfish$Pop, pupfish$Sex)
fit.common <- procD.lm(coords ~ logSize + Group,
data = pupfish, print.progress = FALSE)
fit.unique <- procD.lm(coords ~ logSize * Group,
data = pupfish, print.progress = FALSE)
Homogeneity of slopes test
anova(fit.unique, fit.null = fit.common, print.progress = FALSE)
##
## Analysis of Variance, using Residual Randomization
## Permutation procedure: Randomization of null model residuals
## Number of permutations: 1000
## Estimation method: Ordinary Least Squares
## Effect sizes (Z) based on F distributions
##
## ResDf Df RSS SS MS
## coords ~ logSize * Group (Null) 46 1 0.022079
## coords ~ logSize + Group 49 -3 0.024084 -0.0020045 0.00066816
## Total 53 0.056333
## Rsq F Z P Pr(>F)
## coords ~ logSize * Group (Null) 0.000000
## coords ~ logSize + Group -0.035583 1.3594 -2.2247 0.988
## Total
Results suggest that allometric slopes are parallel. Plotting the prediction lines for both models shows that a unique-allometries model is not necessary.
Allometric plots
par(mfcol = c(1,2))
plotAllometry(fit.common, pupfish$CS, logsz = TRUE, method = "PredLine", pch = 21,
bg = pupfish$Group)
plotAllometry(fit.unique, pupfish$CS, logsz = TRUE, method = "PredLine", pch = 21,
bg = pupfish$Group)
Hypothesis tests: vector lengths
PW <- pairwise(fit.unique, groups = pupfish$Group,
covariate = log(pupfish$CS), print.progress = FALSE)
summary(PW, test = "dist")
##
## Pairwise comparisons
##
## Groups: Marsh.F Sinkhole.F Marsh.M Sinkhole.M
##
## RRPP: 1000 permutations
##
## Slopes (vectors of variate change per one unit of covariate change, by group):
## Vectors hidden (use show.vectors = TRUE to view)
##
## Slope vector lengths
## Marsh.F Sinkhole.F Marsh.M Sinkhole.M
## 0.09846857 0.08048951 0.08899635 0.13005765
##
## Pairwise absolute difference (d) between vector lengths, plus statistics
## d UCL (95%) Z Pr > d
## Marsh.F:Sinkhole.F 0.017979061 0.08701897 -0.6684921 0.691
## Marsh.F:Marsh.M 0.009472222 0.08066795 -0.9266208 0.805
## Marsh.F:Sinkhole.M 0.031589079 0.08573929 -0.1404949 0.485
## Sinkhole.F:Marsh.M 0.008506839 0.04838718 -0.7095093 0.706
## Sinkhole.F:Sinkhole.M 0.049568140 0.04244181 2.4759957 0.023
## Marsh.M:Sinkhole.M 0.041061301 0.04776048 1.3887438 0.093
Hypothesis tests: angles between vectors
summary(PW, test = "VC", angle.type = "deg")
##
## Pairwise comparisons
##
## Groups: Marsh.F Sinkhole.F Marsh.M Sinkhole.M
##
## RRPP: 1000 permutations
##
## Slopes (vectors of variate change per one unit of covariate change, by group):
## Vectors hidden (use show.vectors = TRUE to view)
##
## Pairwise statistics based on slopes vector correlations (r) and angles, acos(r)
## The null hypothesis is that r = 1 (parallel vectors).
## This null hypothesis is better treated as the angle between vectors = 0
## r angle UCL (95%) Z Pr > angle
## Marsh.F:Sinkhole.F 0.6318267 50.81498 82.18877 -0.2229753 0.538
## Marsh.F:Marsh.M 0.6139591 52.12367 85.99703 -0.2996071 0.582
## Marsh.F:Sinkhole.M 0.4461018 63.50614 82.21052 0.5581365 0.260
## Sinkhole.F:Marsh.M 0.7175129 44.15048 65.21340 0.1341814 0.395
## Sinkhole.F:Sinkhole.M 0.4627734 62.43378 59.87174 2.1810996 0.036
## Marsh.M:Sinkhole.M 0.5629975 55.73665 66.72786 1.0332053 0.146
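The angle statistic in this test is just the arc-cosine of the vector correlation between two group slope vectors. A sketch in Python/NumPy with hypothetical slope values (not the pupfish estimates):

```python
import numpy as np

# Two hypothetical group slope vectors (shape change per unit of size).
b1 = np.array([0.10, 0.05, -0.02])
b2 = np.array([0.08, 0.07, -0.01])

# Vector correlation r, and the angle between the allometries, acos(r);
# r = 1 (angle = 0) means perfectly parallel slopes.
r = (b1 @ b2) / (np.linalg.norm(b1) * np.linalg.norm(b2))
angle_deg = np.degrees(np.arccos(r))
print(round(angle_deg, 1))   # about 15 degrees
```

In the `pairwise` output, each observed angle is then compared against its permutation distribution (the `UCL (95%)` column) to decide whether it differs from 0.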
Conclusion: group allometries do not diverge significantly. Treating them as parallel allows one to compare group means while accounting for common allometric variation.
PW <- pairwise(fit.common, groups = pupfish$Group,
print.progress = FALSE)
summary(PW, test = "dist")
##
## Pairwise comparisons
##
## Groups: Marsh.F Sinkhole.F Marsh.M Sinkhole.M
##
## RRPP: 1000 permutations
##
## LS means:
## Vectors hidden (use show.vectors = TRUE to view)
##
## Pairwise distances between means, plus statistics
## d UCL (95%) Z Pr > d
## Marsh.F:Sinkhole.F 0.03639758 0.01731029 7.510885 0.001
## Marsh.F:Marsh.M 0.03579214 0.01849114 6.875381 0.001
## Marsh.F:Sinkhole.M 0.03809715 0.01593951 9.197696 0.001
## Sinkhole.F:Marsh.M 0.03681731 0.02334774 4.868793 0.001
## Sinkhole.F:Sinkhole.M 0.01939739 0.01845710 2.166170 0.033
## Marsh.M:Sinkhole.M 0.02694896 0.01879110 4.077037 0.001